Web Survey Bibliography
Collecting information from sampled units over the Internet or by mail is much more cost-efficient than conducting interviews, and these methods make self-enumeration an attractive data-collection method for surveys and censuses. Despite its benefits, self-enumeration, in particular Internet-based data collection, can produce low response rates compared to interviews. To increase response rates, non-respondents are subjected to a mixed mode of follow-up treatments, which influence the resulting probability of response, to encourage them to participate. Because response occurrence is intrinsically conditional, we first record response occurrence in discrete intervals and then characterize the probability of response by a discrete-time hazard. This approach makes it easy to examine when a response is most likely to occur and how the probability of responding varies over both time and follow-up treatments. We use regression analysis to investigate the effect of the mixed mode of follow-up on the response probability. Factors and interactions are commonly treated in regression analyses and have important implications for the interpretation of statistical models. Nonresponse bias can be avoided by multiplying the sampling weight of respondents by the inverse of an estimate of the response probability. Estimators and associated variance estimators of model parameters, as well as of parameters of interest, are studied. In variance estimation we account for correlation over time within the same unit. The problem of optimal resource allocation across stages of the survey design is also investigated.
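The two mechanics the abstract describes, recording response occurrence in discrete intervals for a discrete-time hazard and re-weighting respondents by the inverse of an estimated response probability, can be sketched in a few lines. This is a minimal illustration, not the paper's method: the toy data, function names, and interval structure below are assumptions for demonstration only.

```python
# Sketch: person-period expansion for a discrete-time hazard analysis, and
# inverse-probability weight adjustment for nonresponse.
# All data and names here are illustrative assumptions, not from the paper.

def to_person_period(units):
    """Expand one record per sampled unit into one row per follow-up interval.

    Each unit is (unit_id, last_interval_observed, responded).
    The response indicator is 1 only in the final interval, and only if the
    unit responded there; units that never respond are censored with all 0s.
    A logistic regression on these rows estimates the discrete-time hazard.
    """
    rows = []
    for unit_id, last, responded in units:
        for t in range(1, last + 1):
            event = 1 if (responded and t == last) else 0
            rows.append({"unit": unit_id, "interval": t, "response": event})
    return rows

def ipw_adjust(base_weight, est_response_prob):
    """Multiply the sampling weight by the inverse estimated response probability."""
    return base_weight / est_response_prob

units = [
    ("A", 3, True),   # responded during interval 3
    ("B", 2, False),  # never responded; censored after interval 2
]
pp = to_person_period(units)
# Unit A contributes 3 rows (response=1 only in interval 3);
# unit B contributes 2 rows, all with response=0.
```

Because each unit contributes one row per interval survived, the same unit appears in several rows; this within-unit correlation over time is what the abstract's variance estimators must account for.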
Web survey bibliography - 2015 (291)
- Effects of Mobile versus PC Web on Survey Response Quality: a Crossover Experiment in a Probability...; 2017; Antoun, C.; Couper, M. P.; Conrad, F. G.
- When will Nonprobability Surveys Mirror Probability Surveys? Considering Types of Inference and Weighting...; 2016; Pasek, J.
- Distractions: The Incidence and Consequences of Interruptions for Survey Respondents; 2016; Ansolabehere, S.; Schaffner, B. F.
- The Effect of CATI Questions, Respondents, and Interviewers on Response Time; 2016; Olson, K.; Smyth, J. D.
- Linearization Variance Estimators for Mixed-mode Survey Data when Response Indicators are Modeled...; 2016; Demnati, A.
- Adaptive survey designs to minimize survey mode effects – a case study on the Dutch Labor Force...; 2016; Calinescu, M.; Schouten, B.
- What is the gain in a probability-based online panel to provide Internet access to sampling units that...; 2016; Revilla, M.; Cornilleau, A.; Cousteaux, A-S.; Legleye, S.; de Pedraza, P.
- Representative web-survey!; 2016; Linde, P.
- Assessing targeted approach letters: effects in different modes on response rates, response speed and...; 2016; Lynn, P.
- New Generation of Online Questionnaires?; 2016; Revilla, M.; Ochoa, C.; Turbina, A.
- The Analysis of Respondent’s Behavior toward Edit Messages in a Web Survey; 2016; Park, Y.
- Refining the Web Response Option in the Multiple Mode Collection of the American Community Survey; 2016; Hughes, T.; Tancreto, J.
- The Utility of an Online Convenience Panel for Reaching Rare and Dispersed Populations; 2016; Sell, R.; Goldberg, S.; Conron, K.
- Setting Up an Online Panel Representative of the General Population: The German Internet Panel; 2016; Blom, A. G.; Gathmann, C.; Krieger, U.
- Implementation of Web-Based Respondent Driven Sampling among Men Who Have Sex with Men in Sweden; 2016; Stroemdahl, S.; Lu, X.; Bengtsson, L.; Liljeros, F.; Thorson, A.
- Recommended Practices for the design of business surveys questionnaires; 2016; Macchia, S.
- Web-based versus Paper-based Survey Data: An Estimation of Road Users’ Value of Travel Time Savings...; 2016; Kato, H.; Sakashita, A.; Tsuchiya, Tak.
- Reminder Effect and Data Usability on Web Questionnaire Survey for University Students; 2016; Oishi, T.; Mori, M.; Takata, E.
- Feasibility of using a multilingual web survey in studying the health of ethnic minority youth.; 2016; Kinnunen, J. M.; Malin, M.; Raisamo, S. U.; Lindfors, P. L.; Pere, L. A.; Rimpelae, A. H.
- Respondents of a follow-up web-based survey; 2016; Stoddard, S. A.; Amparo, P.; Popick, H.; Yudd, R.; Sujeer, A.; Baath, M.
- Is One More Reminder Worth It? If So, Pick Up the Phone: Findings from a Web Survey; 2016; Lin-Freeman, L.
- Reducing Underreports of Behaviors in Retrospective Surveys: The Effects of Three Different Strategies...; 2016; Lugtig, P. J.; Glasner, T.; Boeve, A.
- What drives the participation in a monthly research web panel? The experience of ELIPSS, a French random...; 2016; Legleye, S.; Cornilleau, A.; Razakamanana, N.
- When Should I Call You? An Analysis of Differences in Demographics and Responses According to Respondents...; 2016; Vicente, P.; Lopes, I.
- The use and positioning of clarification features in web surveys; 2016; Metzler, A.; Kunz, T.; Fuchs, M.
- Online Surveys are Mixed-Device Surveys. Issues Associated with the Use of Different (Mobile) Devices...; 2016; Toepoel, V.; Lugtig, P. J.
- Mail merge can be used to create personalized questionnaires in complex surveys; 2016; Taljaard, M.; Chaudhry, S. H.; Brehaut, J. C.; Weijer, C.; Grimshaw, J. M.
- Electronic and paper based data collection methods in library and information science research: A comparative...; 2016; Tella, A.
- Stable Relationships, Stable Participation? The Effects of Partnership Dissolution and Changes in Relationship...; 2016; Mueller, B.; Castiglioni, L.
- Identifying Pertinent Variables for Nonresponse Follow-Up Surveys. Lessons Learned from 4 Cases in Switzerland...; 2016; Vandenplas, C.; Joye, D.; Staehli, M. E.; Pollien, A.
- The 2013 Census Test: Piloting Methods to Reduce 2020 Census Costs; 2016; Walejko, G. K.; Miller, P. V.
- Methods can matter: Where Web surveys produce different results than phone interviews; 2016; Keeter, S.
- Sunday shopping – The case of three surveys; 2016; Bethlehem, J.
- Will They Stay or Will They Go? Personality Predictors of Dropout in Online Study; 2016; Nestler, S.; Thielsch, M.; Vasilev, E.; Back, M.
- HUFFPOLLSTER: Why Reaching Latinos Is A Challenge For Pollsters; 2016; Jackson, N. M.; Edwards-Levy, A.; Velencia, J.
- Comprehension and engagement in survey interviews with virtual agents; 2016; Conrad, F. G.; Schober, M. F.; Jans, M.; Orlowski, R. A.; Nielsen, D.; Levenstein, R. M.
- Revisiting “yes/no” versus “check all that apply”: Results from a mixed modes...; 2016; Nicolaas, G.; Campanelli, P.; Hope, S.; Jaeckle, A.; Lynn, P.
- Moderators of Candidate Name-Order Effects in Elections: An Experiment; 2016; Kim, Nu.; Krosnick, J. A.; Casasanto, D.
- Predictive inference for non-probability samples: a simulation study ; 2016; Buelens, B.; Burger, J.; van den Brakel, J.
- Equivalence of paper-and-pencil and computerized self-report surveys in older adults; 2016; Weigold, A.; Weigold, I. K.; Drakeford, M. K.; Dykema, S. A.; Smith, C. A.
- Quality of Different Scales in an Online Survey in Mexico and Colombia; 2016; Revilla, M.; Ochoa, C.
- Swapping bricks for clicks: Crowdsourcing longitudinal data on Amazon Turk; 2016; Daly, T. M.; Nataraajan, R.
- A reliability analysis of Mechanical Turk data; 2016; Rouse, S. V.
- Quota Controls in Survey Research.; 2016; Gittelman, S. H.; Thomas, R. K.; Lavrakas, P. J.; Lange, V.
- Computers, Tablets, and Smart Phones: The Truth About Web-based Surveys; 2016; Merle, P.; Gearhart, S.; Craig, C.; Vandyke, M.; Brooks, M. E.; Rahimi, M.
- Scientific Surveys Based on Incomplete Sampling Frames and High Rates of Nonresponse; 2016; Fahimi, M.; Barlas, F. M.; Thomas, R. K.; Buttermore, N. R.
- Taming Big Data: Using App Technology to Study Organizational Behavior on Social Media; 2015; Bail, C. A.
- The Use of a Nonprobability Internet Panel to Monitor Sexual and Reproductive Health in the General...; 2015; Legleye, S.; Charrance, G.; Razafindratsima, N.; Bajos, N.; Bohet, A.; Moreau, C.
- Adapting Labour Force Survey questions from interviewer-administered modes for web self-completion in...; 2015; Betts, P.; Cubbon, B.
- ESOMAR/GRBN Online Research Guideline; 2015